# ChatML Format Support
## Gryphe Codex 24B Small 3.2 GGUF

*bartowski · Apache-2.0 · Large Language Model · English · 626 · 3*

A quantized GGUF release of Gryphe's Codex-24B-Small-3.2. Quantization reduces the model's memory and compute footprint so it can run efficiently across a range of hardware.
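To illustrate what a quantized release like this trades away, here is a toy sketch of symmetric absmax int8 quantization. It is not the actual GGUF scheme (real GGUF formats quantize per block with packed scales, at various bit widths); the helper names are our own:

```python
def quantize_int8(weights):
    # Symmetric absmax quantization: map floats onto int8 codes in
    # [-127, 127] using a single per-tensor scale. Toy sketch only;
    # GGUF formats use per-block scales and lower bit widths.
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize_int8(codes, scale):
    # Recover approximate float weights from the int8 codes.
    return [c * scale for c in codes]
```

The round trip loses at most half a quantization step per weight, which is why quantized checkpoints are much smaller yet score close to the full-precision originals.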
## Archaeo 12B GGUF

*Delta-Vector · Large Language Model · 125 · 12*

Archaeo-12B is a 12B-parameter model built for role-playing and creative writing, produced by merging the Rei-12B and Francois-Huali-12B models.
## Archaeo 12B

*Delta-Vector · Large Language Model · Transformers · 168 · 12*

A merge of Rei-12B and Francois-Huali-12B via the Slerp (spherical linear interpolation) algorithm, built for role-playing and creative writing.
## EVA Qwen2.5 32B V0.2

*EVA-UNIT-01 · Apache-2.0 · Large Language Model · Transformers · 625 · 53*

A role-playing and story-writing specialist, fully fine-tuned from Qwen2.5-32B on a mix of synthetic and natural data.
## Pygmalion 3 12B

*PygmalionAI · Apache-2.0 · Large Language Model · Transformers · 741 · 43*

An open-source role-playing model fine-tuned from Mistral-Nemo-Base-2407-chatml, focused on generating creative fictional dialogue.
## WhiteRabbitNeo 2.5 Qwen 2.5 Coder 7B

*WhiteRabbitNeo · Apache-2.0 · Large Language Model · Transformers · Multilingual · 385 · 54*

A cybersecurity-focused code-generation model fine-tuned from Qwen2.5-Coder-7B, specializing in offensive and defensive security scenarios.
## Orca Mini V6 8b

*pankajmathur · Large Language Model · Transformers · English · 21 · 2*

A Llama 3 8B-parameter model trained on a variety of SFT datasets for general text-generation tasks.
## J.O.S.I.E.3 Beta12 7B Slerp

*Goekdeniz-Guelmez · Apache-2.0 · Large Language Model · Transformers · Multilingual · 17 · 2*

A 7B-parameter model created by merging Weyaxi/Einstein-v6-7B and argilla/CapybaraHermes-2.5-Mistral-7B. It supports multilingual interaction and uses the ChatML prompt format.
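The ChatML prompt format that gives this page its title wraps each turn in `<|im_start|>` / `<|im_end|>` marker tokens, with the role name on the first line. A minimal sketch of rendering a message list into ChatML text (the markers are the format itself; the helper name is our own):

```python
def to_chatml(messages):
    # Render [{"role": ..., "content": ...}, ...] as ChatML text:
    # each turn is "<|im_start|>ROLE\nCONTENT<|im_end|>".
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)
```

In practice the model's own chat template (e.g. `tokenizer.apply_chat_template` in Transformers) should be used, since ChatML-tuned models register these markers as special tokens.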
## Minueza 32M Chat

*Felladrin · Apache-2.0 · Large Language Model · Transformers · English · 77 · 9*

A 32-million-parameter chat model based on Felladrin/Minueza-32M-Base, trained with supervised fine-tuning (SFT) and direct preference optimization (DPO).
## DiscoLM German 7b V1 AWQ

*TheBloke · Apache-2.0 · Large Language Model · Transformers · Multilingual · 81 · 4*

An AWQ-quantized build of DiscoLM German 7B v1, a Mistral-based 7B-parameter model that supports both German and English.
## Noromaid 7B 0.4 DPO

*NeverSleep · Large Language Model · Transformers · 137 · 27*

A 7B-parameter model co-created by IkariDev and Undi, optimized with DPO training.
## DiscoLM Mixtral 8x7b V2

*DiscoResearch · Apache-2.0 · Large Language Model · Transformers · English · 205 · 124*

An experimental 8x7b mixture-of-experts model built on Mistral AI's Mixtral 8x7b, fine-tuned on the Synthia, MetaMathQA, and Capybara datasets.
## Writing Partner Mistral 7B

*FPHam · Apache-2.0 · Large Language Model · Transformers · English · 21 · 31*

A Mistral fine-tune focused on assisting writing: a creative writing partner that helps authors overcome creative blocks.
## OpenHermes 2.5 Mistral 7B GPTQ

*TheBloke · Apache-2.0 · Large Language Model · Transformers · English · 695 · 28*

A GPTQ-quantized build of OpenHermes 2.5, a Mistral-7B fine-tune strong at code generation and general tasks, with performance surpassing previous OpenHermes releases.